Chapter 27 Tail Inequalities

Author

  • David Hilbert
Abstract

Theorem 27.1.2 (Chebychev inequality) Let X be a random variable with μ_X = E[X], and let σ_X be the standard deviation of X; that is, σ_X = √(E[(X − μ_X)²]). Then, Pr[|X − μ_X| ≥ tσ_X] ≤ 1/t². Proof: Note that Pr[|X − μ_X| ≥ tσ_X] = Pr[(X − μ_X)² ≥ t²σ_X²]. Set Y = (X − μ_X)². Clearly, E[Y] = σ_X². Now, apply Markov's inequality to Y. This work is licensed under the Creative Commons Attribution-Noncommercial 3.0 License. To view a copy of this license, visit http://creativecommons.org/licenses/by-nc/3.0/ or send a letter to Creative Commons, 171 Second Street, Suite 300, San Francisco, California, 94105, USA.
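The bound in Theorem 27.1.2 can be checked numerically. The following is a minimal sketch (not from the original notes): it draws samples of a sum of ten fair dice, estimates μ_X and σ_X empirically, and confirms that the observed tail mass Pr[|X − μ_X| ≥ tσ_X] stays below 1/t² for a few values of t. The choice of distribution and of the values of t is illustrative only.

```python
import random

# Empirical check of Chebyshev's inequality:
# Pr[|X - mu| >= t * sigma] <= 1 / t^2 for any t > 0.
# X here is the sum of 10 fair dice (an arbitrary illustrative choice).

random.seed(1)
n = 100_000
samples = [sum(random.randint(1, 6) for _ in range(10)) for _ in range(n)]

# Empirical mean and standard deviation.
mu = sum(samples) / n
sigma = (sum((x - mu) ** 2 for x in samples) / n) ** 0.5

for t in (1.5, 2.0, 3.0):
    # Fraction of samples at least t standard deviations from the mean.
    tail = sum(1 for x in samples if abs(x - mu) >= t * sigma) / n
    bound = 1 / t ** 2
    print(f"t={t}: empirical tail {tail:.4f} <= Chebyshev bound {bound:.4f}")
    assert tail <= bound
```

For a distribution this well-behaved, the empirical tail is far below 1/t²; Chebyshev is tight only for specially constructed two-point distributions, which is why sharper bounds (Chernoff-type inequalities) are the subject of chapters like this one.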


Similar articles

Some Probability Inequalities for Quadratic Forms of Negatively Dependent Subgaussian Random Variables

In this paper, we obtain the upper exponential bounds for the tail probabilities of the quadratic forms for negatively dependent subgaussian random variables. In particular the law of iterated logarithm for quadratic forms of independent subgaussian random variables is generalized to the case of negatively dependent subgaussian random variables.


Sharp Inequalities for Polygamma Functions

where μ is a nonnegative measure on [0,∞) such that the integral (2) converges for all x > 0. This means that a function f(x) is completely monotonic on (0,∞) if and only if it is a Laplace transform of the measure μ. The completely monotonic functions have applications in different branches of mathematical sciences. For example, they play some role in combinatorics, numerical and asymptotic an...


Tail inequalities for sums of random matrices that depend on the intrinsic dimension

This work provides exponential tail inequalities for sums of random matrices that depend only on intrinsic dimensions rather than explicit matrix dimensions. These tail inequalities are similar to the matrix versions of the Chernoff bound and Bernstein inequality except with the explicit matrix dimensions replaced by a trace quantity that can be small even when the explicit dimensions are large...


Inequalities between hypergeometric tails

A special inequality between the tail probabilities of certain related hypergeometrics was shown by Seneta and Phipps [19] to suggest useful ‘quasi-exact’ alternatives to Fisher’s [5] Exact Test. With this result as motivation, two inequalities of Hájek and Havránek [6] are investigated in this paper and are generalised to produce inequalities in the form required. A parallel inequality in bino...


Chapter 26 Tail Inequalities

Theorem 26.1.2 (Chebychev inequality) Let X be a random variable with μ_X = E[X], and let σ_X be the standard deviation of X; that is, σ_X = √(E[(X − μ_X)²]). Then, Pr[|X − μ_X| ≥ tσ_X] ≤ 1/t². Proof: Note that Pr[|X − μ_X| ≥ tσ_X] = Pr[(X − μ_X)² ≥ t²σ_X²]. Set Y = (X − μ_X)². Clearly, E[Y] = σ_X². Now, apply Markov's inequality to Y. This work is licensed under the Creative Commons Attribution-Noncommerci...




Journal:

Volume   Issue 

Pages  -

Publication year: 2009